The metric property of the quantum Jensen-Shannon divergence

Authors

Abstract

In this short note, we prove that the square root of the quantum Jensen-Shannon divergence is a true metric on the cone of positive matrices, and hence, in particular, on the quantum state space.
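For reference, the divergence in question is built from the von Neumann entropy; in standard notation (ours, not quoted from the paper), for density matrices ρ and σ,

\[
\mathrm{QJSD}(\rho,\sigma) \;=\; S\!\Big(\frac{\rho+\sigma}{2}\Big) \;-\; \frac{S(\rho)+S(\sigma)}{2},
\qquad S(\rho) = -\operatorname{Tr}(\rho \log \rho),
\]

and the result asserts that \( d(\rho,\sigma) := \sqrt{\mathrm{QJSD}(\rho,\sigma)} \) satisfies the triangle inequality \( d(\rho,\tau) \le d(\rho,\sigma) + d(\sigma,\tau) \); symmetry and positive definiteness are immediate, so \( d \) is a true metric, and the paper extends this to the full cone of positive (not necessarily normalized) matrices.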


Related articles

Metric character of the quantum Jensen-Shannon divergence

P. W. Lamberti, A. P. Majtey, A. Borras, M. Casas, and A. Plastino. Facultad de Matemática, Astronomía y Física, Universidad Nacional de Córdoba, Ciudad Universitaria, 5000 Córdoba, and CONICET, C.C. 727, La Plata 1900, Argentina; Departament de Física and IFISC, Universitat de les Illes Balears, 07122 Palma de Mallorca, Spain; Instituto de Física La Plata, Universidad Nacional de La Plata and CON...


Manifold Learning and the Quantum Jensen-Shannon Divergence Kernel

The quantum Jensen-Shannon divergence kernel [1] was recently introduced in the context of unattributed graphs where it was shown to outperform several commonly used alternatives. In this paper, we study the separability properties of this kernel and we propose a way to compute a low-dimensional kernel embedding where the separation of the different classes is enhanced. The idea stems from the ...
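As a rough illustration of the embedding step this abstract describes, a standard kernel-PCA construction from a precomputed kernel matrix looks as follows. This is a generic sketch assuming a hypothetical QJSD kernel matrix K between graphs, not the enhanced embedding the paper proposes:

import numpy as np

def kernel_pca_embedding(K, dim=2):
    # Double-center the kernel matrix, then project onto the leading
    # eigenvectors scaled by the square roots of their eigenvalues.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    Kc = H @ K @ H                          # centered kernel
    w, v = np.linalg.eigh(Kc)               # eigenvalues in ascending order
    top = np.argsort(w)[::-1][:dim]         # indices of leading components
    return v[:, top] * np.sqrt(np.maximum(w[top], 0.0))

# Usage: given a QJSD kernel matrix K,
# X = kernel_pca_embedding(K, dim=2) yields 2-D coordinates per graph.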


Attributed Graph Similarity from the Quantum Jensen-Shannon Divergence

One of the most fundamental problems we face in the graph domain is that of establishing the similarity, or alternatively the distance, between graphs. In this paper, we address the problem of measuring the similarity between attributed graphs. In particular, we propose a novel way to measure the similarity through the evolution of a continuous-time quantum walk. Given a pair of graphs, we ...
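A minimal sketch of the ingredients, assuming the graph Laplacian as the walk Hamiltonian and a uniform starting state (the paper's actual construction, including how the pair of graphs is combined before comparison, is more involved):

import numpy as np
from scipy.linalg import expm

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho log rho), computed from the eigenvalues.
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log(w)).sum())

def qjsd(rho, sigma):
    # Quantum Jensen-Shannon divergence between two density matrices.
    return von_neumann_entropy((rho + sigma) / 2) \
        - 0.5 * (von_neumann_entropy(rho) + von_neumann_entropy(sigma))

def ctqw_state(A, t=1.0):
    # Evolve a continuous-time quantum walk on adjacency matrix A for
    # time t and return the resulting pure-state density matrix.
    L = np.diag(A.sum(axis=1)) - A          # graph Laplacian
    n = A.shape[0]
    psi0 = np.ones(n) / np.sqrt(n)          # uniform superposition
    psi_t = expm(-1j * t * L) @ psi0        # Schrodinger evolution
    return np.outer(psi_t, psi_t.conj())

# Two graphs of the same size could then be compared via
# qjsd(ctqw_state(A1, t), ctqw_state(A2, t)).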


Nonextensive Generalizations of the Jensen-Shannon Divergence

Convexity is a key concept in information theory, namely via the many implications of Jensen’s inequality, such as the non-negativity of the Kullback-Leibler divergence (KLD). Jensen’s inequality also underlies the concept of Jensen-Shannon divergence (JSD), which is a symmetrized and smoothed version of the KLD. This paper introduces new JSD-type divergences, by extending its two building bloc...
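For context, the two building blocks combine, in the classical discrete case, as follows (standard definitions, not taken from this paper):

\[
\mathrm{KL}(P\|Q) = \sum_i p_i \log\frac{p_i}{q_i},
\qquad
\mathrm{JSD}(P,Q) = \tfrac{1}{2}\,\mathrm{KL}(P\|M) + \tfrac{1}{2}\,\mathrm{KL}(Q\|M),
\quad M = \tfrac{1}{2}(P+Q),
\]

which makes the JSD symmetric in P and Q, and smoothed in the sense that the mixture M is nonzero wherever P or Q is, so the divergence is always finite.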


Non-parametric Jensen-Shannon Divergence

Quantifying the difference between two distributions is a common problem in many machine learning and data mining tasks. What is also common in many tasks is that we only have empirical data. That is, we do not know the true distributions nor their form, and hence, before we can measure their divergence we first need to assume a distribution or perform estimation. For exploratory purposes this ...
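To make the estimation step concrete: a naive plug-in estimate from two samples, using shared histogram bins, might look like the sketch below. It illustrates the problem the abstract describes; it is not the non-parametric estimator the paper proposes.

import numpy as np

def jsd_histogram(x, y, bins=32):
    # Plug-in Jensen-Shannon divergence estimate from samples x and y,
    # using a shared binning so the two histograms are comparable.
    lo = min(x.min(), y.min())
    hi = max(x.max(), y.max())
    p, _ = np.histogram(x, bins=bins, range=(lo, hi))
    q, _ = np.histogram(y, bins=bins, range=(lo, hi))
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)

    def kl(a, b):
        # KL(a || b) restricted to the support of a; there b >= a/2 > 0.
        mask = a > 0
        return float((a[mask] * np.log(a[mask] / b[mask])).sum())

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Example: samples from two shifted Gaussians.
# rng = np.random.default_rng(0)
# jsd_histogram(rng.normal(0, 1, 1000), rng.normal(1, 1, 1000))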



Journal

Journal title: Advances in Mathematics

Year: 2021

ISSN: 0001-8708

DOI: https://doi.org/10.1016/j.aim.2021.107595